Regularized Orthogonal Tensor Decompositions for Multi-Relational Learning

Authors

  • Fanhua Shang
  • Yuanyuan Liu
  • James Cheng
  • Hong Cheng
Abstract

Multi-relational learning has received considerable attention from researchers in various communities. Most existing methods either suffer from a superlinear per-iteration cost or are sensitive to the given ranks. To address both issues, we propose a scalable core tensor trace norm Regularized Orthogonal Iteration Decomposition (ROID) method for full or incomplete tensor analytics, which can be generalized to a graph Laplacian regularized version that uses auxiliary information, or to a sparse higher-order orthogonal iteration (SHOOI) version. We first establish the equivalence between the Schatten p-norm (0 < p < ∞) of a tensor with low multilinear rank and that of its core tensor. This reduces the problem to a much smaller matrix trace norm minimization. Finally, we develop two efficient augmented Lagrange multiplier algorithms to solve our problems with convergence guarantees. Extensive experiments on both real and synthetic datasets, even with only a few observations, verify the efficiency and effectiveness of our methods.
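
For intuition, here is a minimal sketch of the norm-equivalence step, written in our own notation (the Tucker form X = G ×₁ U₁ ⋯ ×ₙ Uₙ with column-orthonormal factors is assumed here and is not spelled out in the abstract itself):

```latex
% Mode-n unfolding of the assumed Tucker model; \otimes is the Kronecker product.
\[
  X_{(n)} = U_n \, G_{(n)}
    \left( U_N \otimes \cdots \otimes U_{n+1} \otimes U_{n-1} \otimes \cdots \otimes U_1 \right)^{\top}
\]
% Multiplication by matrices with orthonormal columns leaves the nonzero
% singular values of G_{(n)} unchanged, so for 0 < p < \infty:
\[
  \left\| X_{(n)} \right\|_{S_p} = \left\| G_{(n)} \right\|_{S_p}
\]
% Hence the tensor Schatten p-norm problem reduces to a much smaller one
% posed on the core tensor, which is what makes the method scalable.
```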


Similar articles

Introduction to Tensor Decompositions and their Applications in Machine Learning

Tensors are multidimensional arrays of numerical values and therefore generalize matrices to multiple dimensions. While tensors first emerged in the psychometrics community in the 20th century, they have since spread to numerous other disciplines, including machine learning. Tensors and their decompositions are especially beneficial in unsupervised learning settings, but are gaining popularit...


Regularized Tensor Factorizations and Higher-Order Principal Components Analysis

High-dimensional tensors or multi-way data are becoming prevalent in areas such as biomedical imaging, chemometrics, networking and bibliometrics. Traditional approaches to finding lower dimensional representations of tensor data include flattening the data and applying matrix factorizations such as principal components analysis (PCA) or employing tensor decompositions such as the CANDECOMP / P...
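
As a concrete illustration of that traditional baseline, here is a minimal numpy sketch of "flatten, then apply PCA" (the array shape and variable names are invented for the example):

```python
import numpy as np

# A toy 3-way data array: 10 samples, each an 8 x 6 measurement slice.
X = np.random.default_rng(0).standard_normal((10, 8, 6))

# "Flattening": each sample becomes one long row vector.
X_flat = X.reshape(10, -1)                  # shape (10, 48)

# Plain PCA on the flattened data via the SVD of the centered matrix.
X_centered = X_flat - X_flat.mean(axis=0)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
scores = U[:, :2] * S[:2]                   # top-2 principal component scores
```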


Efficient Orthogonal Tensor Decomposition, with an Application to Latent Variable Model Learning

Decomposing tensors into orthogonal factors is a well-known task in statistics, machine learning, and signal processing. We study orthogonal outer product decompositions, where the factors in the summands are required to be orthogonal across summands, by relating this orthogonal decomposition to the singular value decompositions of the flattenings. We show that it is a non-t...
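
A small numpy sketch of the connection the abstract describes, for a synthetic 3-way tensor with orthonormal factors (the construction and names are ours, purely illustrative): the leading left singular vectors of the mode-1 flattening recover the orthogonal factors.

```python
import numpy as np

rng = np.random.default_rng(1)
# Orthonormal factors a_k (columns of A) and positive weights lam_k.
A, _ = np.linalg.qr(rng.standard_normal((5, 3)))
lam = np.array([3.0, 2.0, 1.0])

# T = sum_k lam_k * a_k (x) a_k (x) a_k, an orthogonally decomposable tensor.
T = np.einsum('k,ik,jk,lk->ijl', lam, A, A, A)

# SVD of the mode-1 flattening: the singular values are the lam_k, and the
# leading left singular vectors recover the factors a_k up to sign.
U, S, _ = np.linalg.svd(T.reshape(5, -1), full_matrices=False)
print(np.round(S[:3], 6))                              # ~ [3. 2. 1.]
print(np.allclose(np.abs(U[:, :3].T @ A), np.eye(3)))  # True
```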


Coordinate-descent for learning orthogonal matrices through Givens rotations

Optimizing over the set of orthogonal matrices is a central component in problems like sparse PCA or tensor decomposition. Unfortunately, such optimization is hard, since simple operations on orthogonal matrices easily break orthogonality, and correcting orthogonality usually costs a large amount of computation. Here we propose a framework for optimizing orthogonal matrices, that is the parallel ...
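
The idea can be sketched in a few lines of numpy. Note this is a naive grid-search variant for illustration only, not the parallelizable scheme the paper proposes; the function names and the toy objective are ours.

```python
import numpy as np

def givens(n, i, j, theta):
    """The n x n Givens rotation acting in the (i, j) coordinate plane."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = c; G[j, j] = c
    G[i, j] = -s; G[j, i] = s
    return G

def maximize_over_orthogonal(objective, n, sweeps=20, grid=64):
    """Coordinate ascent over Givens angles. Every update
    Q <- Q @ G(i, j, theta) keeps Q exactly orthogonal, so no costly
    re-orthogonalization step is ever needed."""
    Q = np.eye(n)
    thetas = np.linspace(0.0, 2.0 * np.pi, grid, endpoint=False)
    for _ in range(sweeps):
        for i in range(n):
            for j in range(i + 1, n):
                # One "coordinate": the best rotation angle in plane (i, j).
                vals = [objective(Q @ givens(n, i, j, t)) for t in thetas]
                Q = Q @ givens(n, i, j, thetas[int(np.argmax(vals))])
    return Q

# Toy usage: align Q with a fixed matrix B by maximizing trace(Q^T B).
B = np.random.default_rng(2).standard_normal((4, 4))
Q = maximize_over_orthogonal(lambda M: np.trace(M.T @ B), 4)
print(np.allclose(Q.T @ Q, np.eye(4)))  # orthogonality preserved: True
```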


Tensor Decompositions for Learning Latent Variable Models

This work considers a computationally and statistically efficient parameter estimation method for a wide class of latent variable models, including Gaussian mixture models, hidden Markov models, and latent Dirichlet allocation, which exploits a certain tensor structure in their low-order observable moments (typically of second and third order). Specifically, parameter estimation is reduced to the pro...
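
For intuition, the moment structure such estimators exploit looks as follows for a simple mixture model with weights w_k and component means μ_k (a standard textbook form; the paper's exact model conditions may differ):

```latex
% Low-order moment structure for a simple mixture: weights w_k, means \mu_k.
\[
  M_2 = \sum_{k=1}^{K} w_k \, \mu_k \otimes \mu_k ,
  \qquad
  M_3 = \sum_{k=1}^{K} w_k \, \mu_k \otimes \mu_k \otimes \mu_k .
\]
% After whitening M_3 with M_2, recovering the \mu_k amounts to an
% orthogonal tensor decomposition, which is the reduction described above.
```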



Journal:
  • CoRR

Volume: abs/1512.08120  Issue: -

Pages: -

Publication date: 2015